
    Higher-order principal component analysis for the approximation of tensors in tree-based low-rank formats

    This paper is concerned with the approximation of tensors using tree-based tensor formats, which are tensor networks whose graphs are dimension partition trees. We consider Hilbert tensor spaces of multivariate functions defined on a product set equipped with a probability measure. This includes the case of multidimensional arrays corresponding to finite product sets. We propose and analyse an algorithm for the construction of an approximation using only point evaluations of a multivariate function, or evaluations of some entries of a multidimensional array. The algorithm is a variant of higher-order singular value decomposition which constructs a hierarchy of subspaces associated with the different nodes of the tree and a corresponding hierarchy of interpolation operators. Optimal subspaces are estimated using empirical principal component analysis of interpolations of partial random evaluations of the function. The algorithm is able to provide an approximation in any tree-based format with either a prescribed rank or a prescribed relative error, with a number of evaluations of the order of the storage complexity of the approximation format. Under some assumptions on the estimation of principal components, we prove that the algorithm provides either a quasi-optimal approximation with a given rank, or an approximation satisfying the prescribed relative error, up to constants depending on the tree and the properties of interpolation operators. The analysis takes into account the discretization errors for the approximation of infinite-dimensional tensors. Several numerical examples illustrate the main results and the behavior of the algorithm for the approximation of high-dimensional functions using hierarchical Tucker or tensor train formats, and the approximation of univariate functions using tensorization.
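
    For reference, the following is a minimal numpy sketch of the classical truncated higher-order SVD in the Tucker format, of which the algorithm above is a sample-based variant: the paper estimates the subspaces by empirical PCA of interpolated partial random evaluations rather than from exact unfoldings, and works with general dimension partition trees. The function names and the test function below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def hosvd(tensor, ranks):
    """Truncated higher-order SVD (Tucker format), on a fully known array.

    One subspace (factor) per mode, obtained from the principal
    components of the mode unfolding, then a projected core tensor.
    """
    factors = []
    for mu in range(tensor.ndim):
        # mode-mu unfolding: rows indexed by mode mu
        unfolding = np.moveaxis(tensor, mu, 0).reshape(tensor.shape[mu], -1)
        # leading left singular vectors = principal components of the unfolding
        u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(u[:, :ranks[mu]])
    # core = tensor contracted with the transposed factor on every mode
    core = tensor
    for mu, u in enumerate(factors):
        core = np.moveaxis(np.tensordot(u.T, np.moveaxis(core, mu, 0), axes=1), 0, mu)
    return core, factors

# rank-1 test function f(x,y,z) = exp(-(x+y+z)) on a 20^3 grid
x = np.linspace(0, 1, 20)
t = np.exp(-np.add.outer(np.add.outer(x, x), x))
core, factors = hosvd(t, ranks=[2, 2, 2])
approx = core
for mu, u in enumerate(factors):
    approx = np.moveaxis(np.tensordot(u, np.moveaxis(approx, mu, 0), axes=1), 0, mu)
print(np.linalg.norm(t - approx) / np.linalg.norm(t))  # near machine precision
```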

    Low-rank approximate inverse for preconditioning tensor-structured linear systems

    In this paper, we propose an algorithm for the construction of low-rank approximations of the inverse of an operator given in low-rank tensor format. The construction relies on an updated greedy algorithm for the minimization of a suitable distance to the inverse operator. It provides a sequence of approximations that are defined as the projections of the inverse operator onto an increasing sequence of linear subspaces of operators. These subspaces are obtained by the tensorization of bases of operators that are constructed from successive rank-one corrections. In order to handle high-order tensors, approximate projections are computed in low-rank Hierarchical Tucker subsets of the successive subspaces of operators. Some desired properties, such as symmetry or sparsity, can be imposed on the approximate inverse operator during the correction step, where an optimal rank-one correction is sought as the tensor product of operators with the desired properties. Numerical examples illustrate the ability of this algorithm to provide efficient preconditioners for linear systems in tensor format that improve the convergence of iterative solvers and also the quality of the resulting low-rank approximations of the solution.
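
    To fix ideas, here is a heavily simplified, hypothetical sketch of the greedy rank-one construction for a plain matrix: it reduces $\|I - PA\|_F$ by successive rank-one corrections, each computed by alternating least squares. The paper's algorithm instead works with operators in low-rank tensor formats, re-projects onto the tensorized subspaces, uses a more suitable distance, and can impose symmetry or sparsity on the corrections; none of that is reproduced here.

```python
import numpy as np

def greedy_approx_inverse(a_mat, n_terms=5, n_als=20, seed=0):
    """Greedy low-rank approximate inverse P = sum_k u_k v_k^T (matrix sketch)."""
    n = a_mat.shape[0]
    rng = np.random.default_rng(seed)
    p = np.zeros((n, n))
    for _ in range(n_terms):
        r = np.eye(n) - p @ a_mat                  # current residual I - P A
        v = rng.standard_normal(n)
        for _ in range(n_als):
            w = a_mat.T @ v                        # v fixed: best u minimizes ||R - u w^T||_F
            u = r @ w / (w @ w)
            z = r.T @ u / (u @ u)                  # u fixed: want A^T v ~ z, in least squares
            v = np.linalg.lstsq(a_mat.T, z, rcond=None)[0]
        p += np.outer(u, v)
    return p

# residual norm ||I - P A||_F decreases as rank-one terms are added
n = 30
a = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1D Laplacian stencil
for k in (1, 4, 8):
    p = greedy_approx_inverse(a, n_terms=k)
    print(k, np.linalg.norm(np.eye(n) - p @ a))
```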

    A tensor approximation method based on ideal minimal residual formulations for the solution of high-dimensional problems

    In this paper, we propose a method for the approximation of the solution of high-dimensional weakly coercive problems formulated in tensor spaces using low-rank approximation formats. The method can be seen as a perturbation of a minimal residual method with a residual norm corresponding to the error in a specified solution norm. We introduce and analyze an iterative algorithm that is able to provide a controlled approximation of the optimal approximation of the solution in a given low-rank subset, without any a priori information on this solution. We also introduce a weak greedy algorithm which uses this perturbed minimal residual method for the computation of successive greedy corrections in small tensor subsets. We prove its convergence under some conditions on the parameters of the algorithm. The residual norm can be designed such that the resulting low-rank approximations are quasi-optimal with respect to particular norms of interest, thus yielding goal-oriented order reduction strategies for the approximation of high-dimensional problems. The proposed numerical method is applied to the solution of a stochastic partial differential equation which is discretized using standard Galerkin methods in tensor product spaces.
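
    The ideal-residual idea summarized above can be stated in one line: if the residual of $Au = b$ is measured in the norm transported by $A^{-1}$, residual minimization over a low-rank subset coincides with error minimization in the solution norm (a standard identity; the notation below is ours):

```latex
% For Au = b with solution u and a low-rank subset S_r, measure the
% residual in the norm transported by A^{-1}, so that
\[
  \| b - A v \|_{\star} := \| A^{-1}(b - A v) \|_X = \| u - v \|_X ,
  \qquad
  \min_{v \in S_r} \| b - A v \|_{\star} = \min_{v \in S_r} \| u - v \|_X .
\]
% A^{-1} is not computable, so the paper replaces this ideal residual
% norm by a controlled perturbation, which is the source of the
% quasi-optimality constants mentioned in the abstract.
```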

    Geometric Structures in Tensor Representations (Final Release)

    The main goal of this paper is to study the geometric structures associated with the representation of tensors in subspace-based formats. To do this we use a property of the so-called minimal subspaces which allows us to describe the tensor representation by means of a rooted tree. By using the tree structure and the dimensions of the associated minimal subspaces, we introduce, in the underlying algebraic tensor space, the set of tensors in a tree-based format with either bounded or fixed tree-based rank. This class contains the Tucker format and the Hierarchical Tucker format (including the Tensor Train format). In particular, we show that the set of tensors in the tree-based format with bounded (respectively, fixed) tree-based rank of an algebraic tensor product of normed vector spaces is an analytic Banach manifold. Indeed, the manifold geometry for the set of tensors with fixed tree-based rank is induced by a fibre bundle structure, and the manifold geometry for the set of tensors with bounded tree-based rank is given by a finite union of connected components. In order to describe the relationship between these manifolds and the natural ambient space, we introduce the definition of topological tensor spaces in the tree-based format. We prove under natural conditions that any tensor of the topological tensor space under consideration admits best approximations in the manifold of tensors in the tree-based format with bounded tree-based rank. In this framework, we also show that the tangent (Banach) space at a given tensor is a complemented subspace in the natural ambient tensor Banach space, and hence the set of tensors in the tree-based format with bounded (respectively, fixed) tree-based rank is an immersed submanifold. This fact allows us to extend the Dirac-Frenkel variational principle to the framework of topological tensor spaces.
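
    For concreteness, the two sets discussed above can be written via the minimal-subspace definition of the tree-based rank (the notation is ours, not the paper's):

```latex
% For a dimension partition tree T over {1,...,d} and a tensor v, the
% alpha-rank is the dimension of the minimal subspace at node alpha:
\[
  \operatorname{rank}_\alpha(v) = \dim U_\alpha^{\min}(v),
  \qquad \alpha \in T ,
\]
% and the sets studied in the paper are
\[
  \mathcal{M}_{\le \mathfrak{r}}
    = \{\, v : \operatorname{rank}_\alpha(v) \le r_\alpha
      \ \text{for all } \alpha \in T \,\},
  \qquad
  \mathcal{M}_{\mathfrak{r}}
    = \{\, v : \operatorname{rank}_\alpha(v) = r_\alpha
      \ \text{for all } \alpha \in T \,\},
\]
% the latter an analytic Banach manifold, the former a set in which
% best approximations exist under the paper's conditions.
```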

    Principal bundle structure of matrix manifolds

    In this paper, we introduce a new geometric description of the manifolds of matrices of fixed rank. The starting point is a geometric description of the Grassmann manifold $\mathbb{G}_r(\mathbb{R}^k)$ of linear subspaces of dimension $r<k$ in $\mathbb{R}^k$ which avoids the use of equivalence classes. The set $\mathbb{G}_r(\mathbb{R}^k)$ is equipped with an atlas which provides it with the structure of an analytic manifold modelled on $\mathbb{R}^{(k-r)\times r}$. Then we define an atlas for the set $\mathcal{M}_r(\mathbb{R}^{k \times r})$ of full-rank matrices and prove that the resulting manifold is an analytic principal bundle with base $\mathbb{G}_r(\mathbb{R}^k)$ and typical fibre $\mathrm{GL}_r$, the general linear group of invertible matrices in $\mathbb{R}^{r\times r}$. Finally, we define an atlas for the set $\mathcal{M}_r(\mathbb{R}^{n \times m})$ of non-full-rank matrices and prove that the resulting manifold is an analytic principal bundle with base $\mathbb{G}_r(\mathbb{R}^n) \times \mathbb{G}_r(\mathbb{R}^m)$ and typical fibre $\mathrm{GL}_r$. The atlas of $\mathcal{M}_r(\mathbb{R}^{n \times m})$ is indexed on the manifold itself, which allows a natural definition of a neighbourhood for a given matrix, this neighbourhood being proved to possess the structure of a Lie group. Moreover, the set $\mathcal{M}_r(\mathbb{R}^{n \times m})$ equipped with the topology induced by the atlas is proven to be an embedded submanifold of the matrix space $\mathbb{R}^{n \times m}$ equipped with the subspace topology. The proposed geometric description then results in a description of the matrix space $\mathbb{R}^{n \times m}$, seen as the union of manifolds $\mathcal{M}_r(\mathbb{R}^{n \times m})$, as an analytic manifold equipped with a topology for which the matrix rank is a continuous map.
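
    A quick consistency check of the bundle structure described above is the dimension count it implies, which recovers the classical dimension of the fixed-rank matrix manifold:

```latex
% The base Gr(R^n) x Gr(R^m) is modelled on R^{(n-r) x r} x R^{(m-r) x r},
% and the typical fibre GL_r is an open subset of R^{r x r}, hence
\[
  \dim \mathcal{M}_r(\mathbb{R}^{n\times m})
    = r(n-r) + r(m-r) + r^2
    = r\,(n+m-r),
\]
% the classical dimension of the set of rank-r matrices in R^{n x m}.
```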

    Tensor-based multiscale method for diffusion problems in quasi-periodic heterogeneous media

    This paper addresses complexity reduction for the numerical simulation of multiscale media in a quasi-periodic setting. We consider a stationary elliptic diffusion equation defined on a domain $D$ such that $\overline{D}$ is the union of cells $\{\overline{D_i}\}_{i\in I}$, and we introduce a two-scale representation by identifying any function $v(x)$ defined on $D$ with a bivariate function $v(i,y)$, where $i \in I$ relates to the index of the cell containing the point $x$ and $y \in Y$ relates to a local coordinate in a reference cell $Y$. We introduce a weak formulation of the problem in a broken Sobolev space $V(D)$ using a discontinuous Galerkin framework. The problem is then interpreted as a tensor-structured equation by identifying $V(D)$ with a tensor product space $\mathbb{R}^I \otimes V(Y)$ of functions defined over the product set $I\times Y$. Tensor numerical methods are then used in order to exploit the approximability of quasi-periodic solutions by low-rank tensors.
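
    As an illustration of the two-scale identification, the following hypothetical numpy sketch reshapes a discretized field into a (cell index) x (local coordinate) matrix, i.e. a discrete element of $\mathbb{R}^I \otimes V(Y)$, and shows that quasi-periodicity yields fast singular value decay (toy data, not the paper's discretization):

```python
import numpy as np

# |I| cells, each carrying n_loc local degrees of freedom on the
# reference cell Y; rows of v are the per-cell restrictions v(i, .)
n_cells, n_loc = 64, 100
y = np.linspace(0.0, 1.0, n_loc)
cells = np.arange(n_cells)

# quasi-periodic field: periodic pattern modulated slowly across cells,
# plus a slow large-scale drift
v = (np.sin(2 * np.pi * 4 * y)[None, :]
     * (1.0 + 0.1 * np.cos(2 * np.pi * cells / n_cells))[:, None]
     + 0.05 * np.outer(cells / n_cells, y))

# near-repeated rows => rapidly decaying singular values => low rank
u, s, vt = np.linalg.svd(v, full_matrices=False)
print(s[:5] / s[0])

rank2 = (u[:, :2] * s[:2]) @ vt[:2]
print(np.linalg.norm(v - rank2) / np.linalg.norm(v))  # rank-2 error near zero
```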